LCRM: Layer-wise Complexity Reduction Method for CNN Model Optimization on End Devices
Authors
Abstract
The increasing significance of state-of-the-art convolutional neural network (CNN) models in computer vision tasks has led to their widespread use in industry and academia. However, deploying these models in resource-limited environments, such as IoT devices or embedded GPUs, presents challenges due to their increased complexity and resource consumption. This research paper proposes an optimization algorithm called the Layer-wise Complexity Reduction Method (LCRM) to address these challenges by converting accuracy-focused CNNs into lightweight models. LCRM evaluates the standard convolution layers of a model and replaces them with the most efficient combination of substitutional convolutions based on output channel size. The primary goal is to reduce the computational complexity and hardware requirements of the parent model. We assess the effectiveness of our framework by evaluating its performance on various CNN models, including AlexNet, VGG-9, U-Net, and Retinex-Net, across different applications: image classification, optical character recognition, segmentation, and image enhancement. Our experimental results show up to a 95% reduction in inference latency and a 93% reduction in energy consumption when deployed on a GPU. Furthermore, we compare LCRM-optimized models against established compression methods, namely pruning, quantization, and clustering, on four cascaded Raspberry Pi-4 devices. Profiling experiments performed on each model demonstrate that LCRM-optimized models achieve comparable or better accuracy than these methods while providing the added benefits of up to 62.84% end-to-end latency reduction and significant memory compression.
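The abstract does not specify which substitutional convolutions LCRM selects, but a common substitution of this kind replaces a standard convolution with a depthwise-separable one, whose savings grow with the output channel count. The sketch below is a minimal, hypothetical illustration (not the authors' implementation) using the standard multiply-accumulate (MAC) counts for each variant:

```python
# Hypothetical sketch: FLOP savings from substituting a standard convolution
# with a depthwise-separable convolution, the kind of layer-wise substitution
# LCRM-style methods rely on. Counts are multiply-accumulate operations (MACs).

def standard_conv_flops(h, w, c_in, c_out, k):
    """MACs for a standard k x k convolution over an h x w feature map."""
    return h * w * c_in * c_out * k * k

def depthwise_separable_flops(h, w, c_in, c_out, k):
    """MACs for a depthwise k x k conv followed by a 1x1 pointwise conv."""
    depthwise = h * w * c_in * k * k   # one k x k filter per input channel
    pointwise = h * w * c_in * c_out   # 1x1 conv mixing channels
    return depthwise + pointwise

# Example layer: 32x32 feature map, 64 -> 128 channels, 3x3 kernel.
std = standard_conv_flops(32, 32, 64, 128, 3)        # 75,497,472 MACs
sep = depthwise_separable_flops(32, 32, 64, 128, 3)  # 8,978,432 MACs
print(f"reduction factor: {std / sep:.1f}x")         # ~8.4x fewer MACs
```

The reduction factor is roughly 1/c_out + 1/k², so the benefit of this particular substitution increases with the output channel size — consistent with the abstract's statement that LCRM chooses substitutions based on output channel size.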
Similar resources
NDDR-CNN: Layer-wise Feature Fusing in Multi-Task CNN by Neural Discriminative Dimensionality Reduction
State-of-the-art Convolutional Neural Network (CNN) benefits much from multi-task learning (MTL), which learns multiple related tasks simultaneously to obtain shared or mutually related representations for different tasks. The most widely used MTL CNN structure is based on an empirical or heuristic split on a specific layer (e.g., the last convolutional layer) to minimize multiple task-specific...
Effective End-to-End Link Capacity Measurement for Layer-2 and Layer-3 Devices
Measuring link capacity on the network benefits the success of streaming applications such as video streaming, videoconferencing, content adaptation, and dynamic server selection. This paper studies a novel, effective active algorithm for end-to-end link capacity measurement that integrates the packet-train and packet-pair approaches. The novelty and contributions of our algorithm, as co...
Single-Layer CNN Simulator
An efficient behavioral simulator for Cellular Neural Networks (CNN) is presented in this paper. The simulator is capable of performing single-layer CNN simulations for any size of input image, making it a powerful tool for researchers investigating potential applications of CNN. This paper reports an efficient algorithm exploiting the latency properties of Cellular Neural Networks along with numeri...
An Efficient Method for Model Reduction in Diffuse Optical Tomography
We present an efficient method for the reduction of model equations in the linearized diffuse optical tomography (DOT) problem. We first implement the maximum a posteriori (MAP) estimator and Tikhonov regularization, which are based on applying preconditioners to linear perturbation equations. For model reduction, the preconditioner is split into two parts: the principal components are consid...
Anatomical Data Augmentation For CNN based Pixel-wise Classification
In this work we propose a method for anatomical data augmentation that is based on using slices of computed tomography (CT) examinations that are adjacent to labeled slices as another resource of labeled data for training the network. The extended labeled data is used to train a U-Net network for pixel-wise classification into different hepatic lesions and normal liver tissue. Our dataset co...
Journal
Journal title: IEEE Access
Year: 2023
ISSN: 2169-3536
DOI: https://doi.org/10.1109/access.2023.3290620